
Search in the Catalogues and Directories

Hits 1 – 11 of 11

1. Differentiable Generative Phonology ...
2. Applying the Transformer to Character-level Transduction ...
   Wu, Shijie; Cotterell, Ryan; Hulden, Mans. ETH Zurich, 2021
3. Everything Is All It Takes: A Multipronged Strategy for Zero-Shot Cross-Lingual Information Extraction ...
4. Applying the Transformer to Character-level Transduction
   In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)
5. Do Explicit Alignments Robustly Improve Multilingual Encoders? ...
   Wu, Shijie; Dredze, Mark. arXiv, 2020
6. SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection ...
7. Are All Languages Created Equal in Multilingual BERT? ...
   Wu, Shijie; Dredze, Mark. arXiv, 2020
8. The Paradigm Discovery Problem ...
   Erdmann, Alexander; Elsner, Micha; Wu, Shijie. ETH Zurich, 2020
9. The Paradigm Discovery Problem
   In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (2020)
10. Emerging Cross-lingual Structure in Pretrained Language Models ...
    Abstract: We study the problem of multilingual masked language modeling, i.e. the training of a single model on concatenated text from multiple languages, and present a detailed study of several factors that influence why these models are so effective for cross-lingual transfer. We show, contrary to what was previously hypothesized, that transfer is possible even when there is no shared vocabulary across the monolingual corpora and also when the text comes from very different domains. The only requirement is that there are some shared parameters in the top layers of the multilingual encoder. To better understand this result, we also show that representations from independently trained models in different languages can be aligned post-hoc quite effectively, strongly suggesting that, much like for non-contextual word embeddings, there are universal latent symmetries in the learned embedding spaces. For multilingual masked language modeling, these symmetries seem to be automatically discovered and aligned during the ... (ACL 2020)
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://dx.doi.org/10.48550/arxiv.1911.01464
    https://arxiv.org/abs/1911.01464
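    The post-hoc alignment this abstract mentions mirrors what is standard for non-contextual word embeddings, where two independently trained spaces can be mapped onto each other with a single linear transform. As a purely illustrative sketch, and not the paper's actual procedure, the snippet below aligns two toy embedding spaces with orthogonal Procrustes; the function name and the synthetic data are assumptions made for the example.

```python
import numpy as np

def procrustes_align(X, Y):
    """Solve min_W ||X @ W - Y||_F over orthogonal W.

    X, Y: (n, d) arrays of paired embeddings, e.g. the same items
    encoded by two independently trained monolingual models.
    The optimum is W = U @ Vt, where X.T @ Y = U @ S @ Vt.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy usage: a "source" space that is a noisy rotation of the "target".
rng = np.random.default_rng(0)
Y = rng.normal(size=(1000, 64))                   # target-space embeddings
Q, _ = np.linalg.qr(rng.normal(size=(64, 64)))    # hidden orthogonal map
X = Y @ Q.T + 0.01 * rng.normal(size=(1000, 64))  # source-space embeddings

W = procrustes_align(X, Y)
print(np.linalg.norm(X @ W - Y) / np.linalg.norm(Y))  # small residual
```

    On the toy data the recovered W essentially undoes the hidden rotation, which is the sense in which two separately learned spaces can share latent symmetries that a simple linear map exposes.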
11. The SIGMORPHON 2019 Shared Task: Morphological Analysis in Context and Cross-Lingual Transfer for Inflection ...
